Adaptive function-on-scalar regression with a smoothing elastic net

Authors

Abstract

This paper presents a new methodology, called AFSSEN, to simultaneously select significant predictors and produce smooth estimates in a high-dimensional function-on-scalar linear model with sub-Gaussian errors. Outcomes are assumed to lie in a general real separable Hilbert space, H, while parameters lie in a subspace known as a Cameron–Martin space, K, which is closely related to Reproducing Kernel Hilbert Spaces, so that the parameter estimates inherit particular properties, such as smoothness or periodicity, without enforcing those properties on the data. We propose a regularization method in the style of an adaptive Elastic Net penalty that involves mixing two types of functional norms, providing fine-tuned control of both the smoothing and the variable selection in the estimated model. Asymptotic theory is provided in the form of an oracle property, and we conclude with a simulation study demonstrating the advantages of using AFSSEN over existing methods in terms of prediction error and variable selection.
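
To make the structure of the penalty concrete, the following is a minimal sketch of an AFSSEN-style objective, under the assumptions that there are n observations, I scalar predictors X_ij, and functional coefficients β_j in K, that an H-norm term drives variable selection while a squared K-norm term drives smoothing, and that w_j are adaptive weights with tuning parameters λ_H and λ_K; the exact norms, weighting, and roles used in the paper may differ.

\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta_1,\dots,\beta_I \in K}\;
\frac{1}{2n}\sum_{i=1}^{n}\Big\| Y_i - \sum_{j=1}^{I} X_{ij}\,\beta_j \Big\|_{H}^{2}
\;+\; \lambda_H \sum_{j=1}^{I} w_j\,\|\beta_j\|_{H}
\;+\; \frac{\lambda_K}{2}\sum_{j=1}^{I} \|\beta_j\|_{K}^{2}.
\]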


Similar resources

Bayesian Quantile Regression with Adaptive Elastic Net Penalty for Longitudinal Data

Longitudinal studies are an important part of epidemiological surveys, clinical trials and social studies. In longitudinal studies, the responses are measured repeatedly over time. Often, the main goal is to characterize the change in responses over time and the factors that influence that change. Recently, to analyze this kind of data, quantile regression has been taken ...

Robust Elastic Net Regression

We propose a robust elastic net (REN) model for high-dimensional sparse regression and give its performance guarantees (both the statistical error bound and the optimization bound). A simple idea of trimming the inner product is applied to the elastic net model. Specifically, we robustify the covariance matrix by trimming the inner product based on the intuition that the trimmed inner product c...
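
As a rough illustration of the trimming idea described above, the sketch below computes a trimmed inner product in Python under the assumption that trimming means discarding a fixed fraction of the largest-magnitude elementwise products before summing; the function name trimmed_inner_product and its trim_fraction parameter are illustrative, and the exact trimming rule and the way it enters the REN estimator may differ from the paper.

    import numpy as np

    def trimmed_inner_product(x, y, trim_fraction=0.1):
        # Elementwise products of the two vectors.
        products = x * y
        # Number of largest-magnitude products to discard as potential outliers.
        n_trim = int(np.floor(trim_fraction * products.size))
        if n_trim == 0:
            return float(products.sum())
        # Keep the smallest-magnitude products and sum them.
        keep = np.argsort(np.abs(products))[: products.size - n_trim]
        return float(products[keep].sum())

    # Hypothetical usage: build a robustified Gram (covariance-like) matrix entry by entry.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 5))
    gram = np.array([[trimmed_inner_product(X[:, i], X[:, j]) for j in range(5)]
                     for i in range(5)])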

Elastic net orthogonal forward regression

An efficient two-level model identification method, aiming at maximising a model's generalisation capability, is proposed for a large class of linear-in-the-parameters models from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularisa...

Adaptive Elastic Net: An Improvement of Elastic Net

Lasso has proved to be an extremely successful technique for simultaneous estimation and variable selection. However, lasso has two major drawbacks: first, it does not capture any grouping effect, and second, in some situations lasso solutions are inconsistent. To overcome the inconsistency, the adaptive lasso was recently proposed, in which adaptive weights are used to penalize different coefficients. Adapti...
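
To illustrate the adaptive weighting idea, here is a minimal Python sketch of an adaptive elastic net obtained by reweighting a pilot fit; the alpha, l1_ratio, gamma, and eps values are arbitrary placeholders, and the column-rescaling trick shown here weights both penalty terms, whereas the adaptive elastic net in the literature typically weights only the l1 part.

    import numpy as np
    from sklearn.linear_model import ElasticNet

    rng = np.random.default_rng(0)
    n, p = 100, 20
    X = rng.normal(size=(n, p))
    beta_true = np.zeros(p)
    beta_true[:3] = [2.0, -1.5, 1.0]      # only three active coefficients
    y = X @ beta_true + rng.normal(scale=0.5, size=n)

    # Step 1: initial (non-adaptive) elastic net fit to get pilot estimates.
    pilot = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)

    # Step 2: adaptive weights, large for coefficients the pilot fit shrank toward zero.
    gamma, eps = 1.0, 1e-4
    weights = 1.0 / (np.abs(pilot.coef_) + eps) ** gamma

    # Step 3: apply the weights by rescaling the columns, refit, then unscale.
    X_scaled = X / weights
    adaptive = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X_scaled, y)
    beta_hat = adaptive.coef_ / weights

    print(np.round(beta_hat, 2))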

An Elastic Net Orthogonal Forward Regression Algorithm

In this paper, we propose an efficient two-level model identification method for a large class of linear-in-the-parameters models from observational data. A new elastic net orthogonal forward regression (ENOFR) algorithm is employed at the lower level to carry out simultaneous model selection and elastic net parameter estimation. The two regularization parameters in the elastic net are optim...


Journal

Journal title: Journal of Multivariate Analysis

Year: 2021

ISSN: 0047-259X, 1095-7243

DOI: https://doi.org/10.1016/j.jmva.2021.104765